Improved Neural Networks Based on Mutual Information via Information Geometry
Authors
Abstract
Similar Resources
On Classification of Bivariate Distributions Based on Mutual Information
Among all measures of independence between random variables, mutual information is the only one based on information theory. Mutual information takes into account all kinds of dependencies between variables, i.e., both linear and non-linear dependencies. In this paper we classify some well-known bivariate distributions into two classes of distributions based on their mutua...
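To illustrate the point this abstract makes, the sketch below (an illustrative NumPy example, not taken from the paper; the histogram-based estimator and all names are assumptions) contrasts Pearson correlation with a plug-in mutual-information estimate on a purely non-linear dependence, y = x², where correlation is near zero but mutual information is clearly positive:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)
y = x ** 2  # deterministic, but non-linear: correlation misses it

# Pearson correlation is ~0 for this symmetric relationship
corr = np.corrcoef(x, y)[0, 1]

# Plug-in mutual-information estimate (in nats) from a 2-D histogram
def mutual_information(a, b, bins=64):
    joint, _, _ = np.histogram2d(a, b, bins=bins)
    pxy = joint / joint.sum()                 # joint distribution p(x, y)
    px = pxy.sum(axis=1, keepdims=True)       # marginal p(x)
    py = pxy.sum(axis=0, keepdims=True)       # marginal p(y)
    nz = pxy > 0                              # avoid log(0) on empty cells
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

mi = mutual_information(x, y)
print(f"corr = {corr:.3f}, MI = {mi:.3f}")   # corr near 0, MI well above 0
```

A linear measure reports (near) independence here, while the mutual-information estimate detects the dependence, which is exactly the distinction the abstract draws.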
Information Geometry of Neural Networks
A neural network is an information-processing system composed of neurons or neuron-like elements. There are two different architectures of neural networks. One is a feedforward network, or multilayer perceptron (MLP), and the other is a network with recurrent connections. These networks are specified by a set of parameters called connection weights w = (wij), which are usually modifiable. There...
Mutual Information-Based Modified Randomized Weights Neural Networks
Randomized weights neural networks have fast learning speed and good generalization performance with a single-hidden-layer structure. The input weights of the hidden layer are produced randomly. By employing certain activation functions, the outputs of the hidden layer are calculated with some randomization. The output weights are computed using the pseudoinverse. Mutual information can be used to measure m...
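The training recipe described in this abstract — random input weights, a fixed activation, and output weights solved by pseudoinverse — can be sketched in a few lines of NumPy. This is an illustrative toy (all sizes, data, and names are assumptions, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression task: learn y = sin(x) on [-3, 3]
X = rng.uniform(-3, 3, size=(500, 1))
y = np.sin(X)

# 1) Input weights and biases of the single hidden layer are drawn randomly
n_hidden = 100
W = rng.normal(size=(1, n_hidden))
b = rng.normal(size=n_hidden)

# 2) Hidden-layer outputs via a fixed activation function
H = np.tanh(X @ W + b)

# 3) Output weights solved in closed form with the pseudoinverse
beta = np.linalg.pinv(H) @ y

# Evaluate on fresh points
X_test = np.linspace(-3, 3, 200).reshape(-1, 1)
pred = np.tanh(X_test @ W + b) @ beta
err = np.max(np.abs(pred - np.sin(X_test)))
print(f"max absolute error: {err:.4f}")
```

Because only the output weights are learned, and in closed form, training is a single linear solve, which is the source of the "fast learning speed" the abstract mentions.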
A New Unequal Error Protection Technique Based on the Mutual Information of the MPEG-4 Video Frames over Wireless Networks
The performance of video transmission over wireless channels is limited by channel noise, so many error-resilience tools have been incorporated into the MPEG-4 video compression method. In addition to these tools, the unequal error protection (UEP) technique has been proposed to protect the different parts of an MPEG-4 video packet with different channel coding rates based on the rate...
MINE: Mutual Information Neural Estimation
We argue that the estimation of mutual information between high-dimensional continuous random variables is achievable by gradient descent over neural networks. This paper presents a Mutual Information Neural Estimator (MINE) that is linearly scalable in both dimensionality and sample size. MINE is trainable through backpropagation, and we prove that it is strongly consistent. We illustrate a handful of appl...
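A minimal sketch of the idea behind MINE, under the assumption that it maximizes the Donsker-Varadhan lower bound I(X;Y) ≥ E_joint[T] − log E_marginal[e^T] over a small neural critic T (the tiny hand-backpropagated NumPy network, the data, and all hyperparameters below are illustrative, not the paper's implementation):

```python
import numpy as np

rng = np.random.default_rng(1)

# Correlated Gaussian pair with known ground truth MI = -0.5*log(1 - rho^2)
rho, n = 0.9, 1000
x = rng.normal(size=(n, 1))
y = rho * x + np.sqrt(1 - rho**2) * rng.normal(size=(n, 1))
true_mi = -0.5 * np.log(1 - rho**2)   # about 0.83 nats

# Tiny one-hidden-layer critic T(x, y), trained by plain gradient ascent
h = 64
W1 = rng.normal(scale=0.1, size=(2, h)); b1 = np.zeros(h)
w2 = rng.normal(scale=0.5, size=(h, 1)); b2 = 0.0

def critic(xy):
    a = np.tanh(xy @ W1 + b1)
    return a, a @ w2 + b2

lr = 0.1
for step in range(2000):
    joint = np.hstack([x, y])
    marg = np.hstack([x, rng.permutation(y)])   # shuffled y ~ product of marginals
    a_j, t_j = critic(joint)
    a_m, t_m = critic(marg)
    # Gradient of the DV bound w.r.t. critic outputs:
    #   d/dt_j [mean t_j] = 1/n;  d/dt_m [-log mean exp t_m] = -softmax(t_m)
    et = np.exp(t_m - t_m.max())
    g_j = np.ones_like(t_j) / n
    g_m = -et / et.sum()
    for a, xy, g in [(a_j, joint, g_j), (a_m, marg, g_m)]:
        gw2 = a.T @ g; gb2 = g.sum()
        ga = (g @ w2.T) * (1 - a**2)            # backprop through tanh
        gW1 = xy.T @ ga; gb1 = ga.sum(axis=0)
        W1 += lr * gW1; b1 += lr * gb1          # ascent: maximize the bound
        w2 += lr * gw2; b2 += lr * gb2

# Final DV estimate with a numerically stable log-mean-exp
_, t_j = critic(np.hstack([x, y]))
_, t_m = critic(np.hstack([x, rng.permutation(y)]))
m = t_m.max()
mi_est = float(t_j.mean() - (np.log(np.exp(t_m - m).mean()) + m))
print(f"true MI = {true_mi:.3f}, estimated MI = {mi_est:.3f}")
```

The critic needs no knowledge of the densities: it only sees samples from the joint and from a shuffled (product-of-marginals) batch, which is what makes the approach scale to high-dimensional continuous variables.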
Journal
Journal title: Algorithms
Year: 2019
ISSN: 1999-4893
DOI: 10.3390/a12050103